Statistical Estimation of the Kullback–Leibler Divergence

Authors

Abstract

Asymptotic unbiasedness and L2-consistency are established, under mild conditions, for estimates of the Kullback–Leibler divergence between two probability measures in R^d that are absolutely continuous with respect to (w.r.t.) the Lebesgue measure. The estimates are based on certain k-nearest neighbor statistics of a pair of independent, identically distributed (i.i.d.) vector samples. A novelty of the results is that they also treat mixture models; in particular, they cover mixtures of nondegenerate Gaussian measures. The corresponding asymptotic properties of related estimators of the Shannon entropy and cross-entropy are strengthened as well. Some applications are indicated.
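
The estimators in question are built from k-nearest neighbor statistics of the two samples. As a rough illustration of this family, here is a sketch of the classical Wang–Kulkarni–Verdú k-NN estimator of D(P || Q) in Python; it is not the paper's exact construction (whose constants and conditions may differ), and the function name and SciPy-based details are our own.

```python
import numpy as np
from scipy.spatial import cKDTree

def kl_divergence_knn(x, y, k=1):
    """k-NN estimate of D(P || Q) from samples x ~ P, y ~ Q.

    Sketch of the Wang-Kulkarni-Verdu estimator; assumes P and Q are
    absolutely continuous, so nearest-neighbor distances are positive
    almost surely.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n, d = x.shape
    m, _ = y.shape

    # Distance from each x_i to its k-th nearest neighbor among the
    # other x's (query k + 1 points: the nearest is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # Distance from each x_i to its k-th nearest neighbor among the y's.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

As a sanity check, for x sampled from N(0, 1) and y from N(1, 1) (as n-by-1 arrays), the estimate should approach the closed-form value 1/2 as both sample sizes grow.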


Similar Articles

Penalized Bregman Divergence Estimation via Coordinate Descent

Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. More recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to be computationally superior. This paper explores...
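
To make the coordinate-descent idea concrete, the sketch below shows the basic CD update for the simplest penalized case, lasso-penalized least squares (ordinary squared-error loss rather than the general Bregman divergences studied here); lasso_cd and all of its details are our own illustration.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=100):
    """Coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1.

    Minimal illustrative sketch; assumes every column of X is nonzero
    (e.g., standardized predictors).
    """
    n, p = X.shape
    b = np.zeros(p)
    curv = (X ** 2).sum(axis=0) / n  # per-coordinate curvature x_j'x_j / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j's contribution removed.
            r_j = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r_j / n
            # Soft-thresholding: closed-form minimizer in coordinate j.
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / curv[j]
    return b
```

Each sweep over the coordinates costs O(np) and uses closed-form updates with no line search, which underlies the computational advantage mentioned above.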


Statistical Topology Using the Nonparametric Density Estimation and Bootstrap Algorithm

This paper presents approximate confidence intervals for each function of parameters in a Banach space, based on a bootstrap algorithm. We apply a kernel density approach to estimate the persistence landscape. In addition, we evaluate the quality of the distribution function estimator of random variables using the integrated mean square error (IMSE). The results of simulation studies show a significant impro...
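
As a generic illustration of the bootstrap step (the paper applies it to persistence landscapes estimated by kernel density methods; the helper below is our own simplification for a real-valued statistic):

```python
import numpy as np

def bootstrap_ci(sample, statistic, n_boot=2000, alpha=0.05, seed=None):
    """Percentile-bootstrap (1 - alpha) confidence interval for
    statistic(sample).  Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    sample = np.asarray(sample)
    n = len(sample)
    # Resample with replacement and recompute the statistic each time.
    stats = np.array([statistic(sample[rng.integers(0, n, size=n)])
                      for _ in range(n_boot)])
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])
```

For example, bootstrap_ci(data, np.median) returns an approximate 95% percentile interval for the median.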


Robust Estimation in Linear Regression Model: the Density Power Divergence Approach

The minimum density power divergence method provides a robust estimate in situations where the dataset includes a number of outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, with some numerical examples of the linear regression model, we show the robustness of this est...
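
To sketch the idea, the minimum density power divergence estimator for a normal linear model can be obtained by numerically minimizing the empirical divergence objective of Basu et al. (1998). The code below is our own illustration, not the paper's procedure; the function name, the Nelder–Mead optimizer, and the starting point are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def mdpde_linreg(X, y, alpha=0.5):
    """Minimum density power divergence estimate for y = X @ b + e,
    e ~ N(0, sigma^2).  Illustrative sketch; alpha > 0 tunes
    robustness, and alpha -> 0 approaches maximum likelihood."""
    n, p = X.shape

    def objective(theta):
        b, log_s = theta[:p], theta[p]
        s = np.exp(log_s)                 # keep sigma positive
        f = norm.pdf(y - X @ b, scale=s)  # model density at the residuals
        # Closed form of the integral of f^(1 + alpha) for the normal model.
        integral = ((2 * np.pi) ** (-alpha / 2) * s ** (-alpha)
                    / np.sqrt(1 + alpha))
        # Empirical DPD objective; the theta-free term is dropped.
        return integral - (1 + 1 / alpha) * np.mean(f ** alpha)

    start = np.zeros(p + 1)               # crude start: b = 0, sigma = 1
    res = minimize(objective, start, method="Nelder-Mead")
    return res.x[:p], np.exp(res.x[p])
```

Observations with small model density (outliers) enter the objective only through f^alpha, so larger alpha downweights them; this is the mechanism behind the robustness claimed above.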


Nonparametric Divergence Estimation

A. The von Mises Expansion. Before diving into the auxiliary results of Section 5, let us first derive some properties of the von Mises expansion. It is a simple calculation to verify that the Gateaux derivative is simply the functional derivative of φ in the event that T(F) = ∫ φ(f) dμ. Lemma 8. Let T(F) = ∫ φ(f) dμ, where f = dF/dμ is the Radon–Nikodym derivative, φ is differentiable, and let G be som...
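
For reference, the first-order von Mises expansion behind this passage can be stated as follows (a standard formulation consistent with the snippet's notation; the remainder symbol R_2 is our own):

```latex
% First-order von Mises expansion of T around F, evaluated at G,
% for functionals T(F) = \int \varphi(f)\, d\mu with f = dF/d\mu:
T(G) = T(F) + \int \varphi'(f)\, d(G - F) + R_2(F, G)
% The middle term is the Gateaux derivative of T at F in the
% direction G - F; R_2 collects the second-order remainder.
```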



Journal

Journal title: Mathematics

Year: 2021

ISSN: 2227-7390

DOI: https://doi.org/10.3390/math9050544